Pre-trained transformer-based language models for Sundanese

Authors

Abstract

The Sundanese language has over 32 million speakers worldwide, but the language has reaped little to no benefit from recent advances in natural language understanding. As with other low-resource languages, the only practical alternative has been to fine-tune existing multilingual models. In this paper, we pre-trained three monolingual Transformer-based language models on Sundanese data. When evaluated on a downstream text classification task, most of our monolingual models outperformed larger multilingual models despite having less overall pre-training data. In subsequent analyses, our models benefited strongly from the size of the pre-training corpus and did not exhibit socially biased behavior. We release our models for other researchers and practitioners to use.
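The downstream setup the abstract mentions (a Transformer encoder feeding a text-classification head) can be sketched in miniature. The NumPy snippet below is a toy illustration only, not the authors' code: random weights stand in for a pre-trained checkpoint, and a single self-attention layer, mean pooling, and a softmax head replace a full model.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over tokens X: (seq, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

d, n_classes, seq_len = 16, 3, 10

# In practice these weights would be loaded from a pre-trained checkpoint;
# here they are random placeholders for illustration.
Wq = rng.normal(scale=0.1, size=(d, d))
Wk = rng.normal(scale=0.1, size=(d, d))
Wv = rng.normal(scale=0.1, size=(d, d))
W_cls = rng.normal(scale=0.1, size=(d, n_classes))  # classification head

tokens = rng.normal(size=(seq_len, d))        # embedded input sentence
hidden = self_attention(tokens, Wq, Wk, Wv)   # contextualised token states
pooled = hidden.mean(axis=0)                  # mean-pool over the sequence
probs = softmax(pooled @ W_cls)               # class probabilities
```

In actual fine-tuning, the encoder weights and the classification head would both be updated by gradient descent on labeled Sundanese text rather than used as fixed random matrices.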


Similar articles

A Pre-Trained Ensemble Model for Breast Cancer Grade Detection Based on Small Datasets

Background and Purpose: Nowadays, breast cancer is reported as one of the most common cancers among women. Early detection of the cancer type is essential to help inform subsequent treatment. The newest proposed breast cancer detectors are based on deep learning. Most of these works focus on large datasets and are not developed for small datasets. Although large datasets might lead ...


ImageNet pre-trained models with batch normalization

Convolutional neural networks (CNNs) pre-trained on ImageNet are the backbone of most state-of-the-art approaches. In this paper, we present a new set of pre-trained models with popular state-of-the-art architectures for the Caffe framework. The first release includes Residual Networks (ResNets) with a generation script as well as the batch-normalization variants of AlexNet and VGG19. All models ou...


The Effect of Lexically Based Language Teaching (LBLT) on Vocabulary Learning among Iranian Pre-University Students

The present study investigates the effect of the lexically based (word-centered) teaching method on vocabulary learning among Iranian pre-university students. To this end, two groups of pre-university students (sixty in total) studying in the city of Nurabad, Lorestan province, during the 1389 academic year (Iranian calendar) were selected and conventionally assigned as experimental and control groups. First, to ensure the homogeneity of the two groups in vocabulary knowledge, a ...


Automatic Analysis of Flaws in Pre-Trained NLP Models

Most tools for natural language processing (NLP) today are based on machine learning and come with pre-trained models. In addition, third-parties provide pre-trained models for popular NLP tools. The predictive power and accuracy of these tools depends on the quality of these models. Downstream researchers often base their results on pre-trained models instead of training their own. Consequentl...


Unregistered Multiview Mammogram Analysis with Pre-trained Deep Learning Models

We show two important findings on the use of deep convolutional neural networks (CNNs) in medical image analysis. First, we show that CNN models pre-trained on computer vision databases (e.g., ImageNet) are useful in medical image applications, despite the significant differences in image appearance. Second, we show that multiview classification is possible without the pre-registrati...



Journal

Journal title: Journal of Big Data

Year: 2022

ISSN: 2196-1115

DOI: https://doi.org/10.1186/s40537-022-00590-7